466 research outputs found
The Speed of Light and the Hubble Parameter: The Mass-Boom Effect
We prove here that Newton's universal gravitation and momentum conservation
laws together reproduce Weinberg's relation. It is shown that the Hubble
parameter H must be built into this relation, or equivalently the age of the
Universe t. Using a wave-to-particle interaction technique we then prove that
the speed of light c decreases with cosmological time, and that c is
proportional to the Hubble parameter H. We see the expansion of the Universe as
a local effect due to the LAB value of the speed of light, c_0, taken as constant.
We present a generalized redshift law and find a predicted acceleration for
photons that agrees well with the Pioneer 10/11 anomalous acceleration. We
finally present a cosmological model coherent with the above results, which we
call the Mass-Boom. It features a linear increase of mass m with time, as a
result of the linear decrease of the speed of light c with time together with
the conservation of momentum mc. We obtain a baryonic mass parameter equal to
the curvature parameter, Omega_m = Omega_k, so that the model is of the
Einstein static type: closed, finite, spherical, unlimited, with zero
cosmological constant. This model is the cosmological view as seen by photons,
neutrinos, tachyons, etc., in contrast with the local view, the LAB reference.
Neither dark matter nor dark energy is required by this model. With an
initially constant speed of light during a short time we get inflation (an
exponential expansion).
This converts, during the inflation time, the Planck fluctuation length of
10^-33 cm to the present size of the Universe (about 10^28 cm, constant from
then on). Thereafter the Mass-Boom takes care of bringing the initial mass of
the Universe (about 10^15 g) to its present value of about 10^55 g.
Comment: 15 pages, presented at the 9th Symposium on "Frontiers of Fundamental Physics", 7-9 Jan. 2008, University of Udine, Italy. Changed content.
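One consistent reading of the Mass-Boom scalings quoted in this abstract (a sketch of the stated argument, not equations taken from the paper itself) is:

```latex
% Conservation of momentum with mass growing linearly in time:
p = m(t)\,c(t) = \mathrm{const}, \qquad m(t) \propto t \;\Longrightarrow\; c(t) \propto \frac{1}{t},
% and, with c \propto H as claimed, the Hubble parameter tracks the age of the Universe:
H(t) \propto \frac{1}{t}.
```

Here t is the age of the Universe and H the Hubble parameter, as used in the abstract.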
Weak nuclear forces cause the strong nuclear force
We determine the strength of the weak nuclear force which holds the lattices
of the elementary particles together. We also determine the strength of the
strong nuclear force which emanates from the sides of the nuclear lattices. The
strong force is the sum of the unsaturated weak forces at the surface of the
nuclear lattices. The strong force is then about 10^6 times stronger than the
weak force between two lattice points.
Comment: 12 pages, 1 figure.
Quantum Information and Wave Function Collapse
Information-theoretic restrictions on the information transferred in the
measurement of an object S by an information system O are studied. It is shown
that such constraints, induced by the Heisenberg commutation relations, result
in the loss of information about the purity of the S state. Consequently, it
becomes impossible for O to discriminate between pure and mixed S states. In
individual events this effect is manifested by the stochastic outcomes of pure
S state measurement, i.e. the collapse of the pure S state.
Comment: 8 pages, talk given at the Symposium "Frontiers of Fundamental Physics", Udine, Italy, January 2008, to appear in the Proceedings.
Hidden-variable theory versus Copenhagen quantum mechanics
The main assumptions on which Copenhagen quantum mechanics has been based will
be summarized, and the known (not yet settled) controversy between Einstein
and Bohr will be analyzed anew. The given assumptions have been represented
basically by the time-dependent Schroedinger equation, to which some further
assumptions have been added. Critical comments have been raised against the
given mathematical model structure by Pauli (1933) and by Susskind and
Glogower (1964). They may be removed if only the Schroedinger equation is
retained and the additional assumptions are abandoned, as shown recently. This
seems to contradict the numerous declarations that the Copenhagen model has
been confirmed by experimental results.
However, most of these experiments have tested only agreement with the mere
Schroedinger equation. All the mentioned assumptions have been tested
practically only in the EPR experiment (measurement of coincidence light
transmission through two polarizers) proposed originally by Einstein (1935).
These experimental results have also been interpreted as supporting the
Copenhagen alternative, which has not, however, been true. In fact, the
microscopic world may be described correspondingly only with the help of the
hidden-variable theory, represented by the Schroedinger equation without the
mentioned additional assumptions, with the consequence that the earlier
interpretation gap between the microscopic and macroscopic worlds is removed.
The only difference concerns the existence of discrete states. The
possibilities of human reason in getting to know nature will also be briefly
discussed at the beginning of this contribution.
Comment: 10 pages, 2 figures; v2: local refinements and improvements of the text.
Have Cherenkov telescopes detected a new light boson?
Recent observations by H.E.S.S. and MAGIC strongly suggest that the Universe
is more transparent to very-high-energy gamma rays than previously thought. We
show that this fact can be reconciled with standard blazar emission models
provided that photon oscillations into a very light Axion-Like Particle occur
in extragalactic magnetic fields. A quantitative estimate of this effect indeed
explains the observed data and in particular the spectrum of the blazar 3C279.
Comment: 3 pages, 1 figure, Proceedings of the "Eleventh International Workshop on Topics in Astroparticle and Underground Physics" (TAUP), Roma, Italy, 1-5 July 2009 (to be published in the Proceedings).
Biological Principles in Self-Organization of Young Brain - Viewed from Kohonen Model
Variants of the Kohonen model are proposed to study biological principles of
self-organization in a model of the young brain. We suggest a function to
measure acquired knowledge and use it to auto-adapt the topology of neuronal
connectivity, yielding substantial organizational improvement relative to the
standard model. In the early phase of organization with most intense learning,
we observe that the neural connectivity is of small-world type, which is very
efficient at organizing neurons in response to stimuli. In analogy to the
human brain, where pruning of neural connectivity (and neuron cell death)
occurs in early life, this feature is also present in our model and is found
to stabilize the neuronal response to stimuli.
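The self-organization described in this abstract builds on the standard Kohonen update rule. A minimal sketch of that baseline (the paper's knowledge measure and topology adaptation are not reproduced here; all names and parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy Kohonen self-organizing map: a 1-D chain of neurons learns to
# cover 2-D stimuli drawn uniformly from the unit square.
n_neurons, n_steps = 20, 2000
weights = rng.random((n_neurons, 2))   # neuron weight vectors in [0, 1]^2
sigma0, eta0 = 5.0, 0.5                # initial neighborhood width / learning rate

for t in range(n_steps):
    x = rng.random(2)                  # random stimulus
    winner = np.argmin(np.linalg.norm(weights - x, axis=1))
    # Exponentially shrinking neighborhood and learning rate
    sigma = sigma0 * np.exp(-t / n_steps)
    eta = eta0 * np.exp(-t / n_steps)
    dist = np.abs(np.arange(n_neurons) - winner)   # lattice distance to winner
    h = np.exp(-dist**2 / (2 * sigma**2))          # neighborhood function
    weights += eta * h[:, None] * (x - weights)    # pull neurons toward stimulus

# Every update is a convex combination, so weights stay inside the
# stimulus domain [0, 1]^2.
assert weights.min() >= 0.0 and weights.max() <= 1.0
```

The paper's variants would replace the fixed chain topology with one adapted online by the acquired-knowledge function.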
Evidence for a New Light Boson from Cosmological Gamma-Ray Propagation?
An anomalously large transparency of the Universe to gamma rays has recently
been discovered by the Imaging Atmospheric Cherenkov Telescopes (IACTs)
H.E.S.S. and MAGIC. We show that observations can be reconciled with standard
blazar emission models provided photon oscillations into a very light
Axion-Like Particle occur in extragalactic magnetic fields. A quantitative
estimate of this effect is successfully applied to the blazar 3C279. Our
prediction can be tested with the satellite-borne Fermi/LAT detector as well as
with the ground-based IACTs H.E.S.S., MAGIC, CANGAROO-III, VERITAS and the
Extensive Air Shower arrays ARGO-YBJ and MILAGRO. Our result also offers an
important observational test for models of dark energy wherein quintessence is
coupled to the photon through an effective dimension-five operator.
Comment: 11 pages, 2 figures, Proceedings of the Conference "Frontiers of Fundamental and Computational Physics", AIP Conference Proceedings 1018 (2008).
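For reference, the photon-ALP conversion invoked in this and the preceding abstract is, in the simplest case of a single homogeneous magnetic-field domain, the standard two-level oscillation (a textbook expression, not a formula from the paper itself):

```latex
P_{\gamma \to a} = \sin^2(2\theta)\,\sin^2\!\left(\frac{\Delta_{\mathrm{osc}}\,L}{2}\right),
```

where \theta is the photon-ALP mixing angle (set by the coupling g and the transverse magnetic field B_T), L is the domain length, and \Delta_osc is the oscillation wave number; in the strong-mixing limit this reduces to P approximately (g B_T L / 2)^2.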
A novel background reduction strategy for high level triggers and processing in gamma-ray Cherenkov detectors
Gamma ray astronomy is now at the leading edge for studies related both to
fundamental physics and astrophysics. The sensitivity of gamma detectors is
limited by the huge background, consisting of hadronic cosmic rays (typically
two to three orders of magnitude more abundant than the signal) and of
accidental counts in the detectors. By using the information on the
temporal evolution of the Cherenkov light, the background can be reduced. We
will present here the results obtained within the MAGIC experiment using a new
technique for the reduction of the background. Particle showers produced by
gamma rays show a different temporal distribution with respect to showers
produced by hadrons; the background due to accidental counts shows no
dependence on time. Such a novel strategy can increase the sensitivity of
present instruments.
Comment: 4 pages, 3 figures, Proc. of the 9th Int. Symposium "Frontiers of Fundamental and Computational Physics" (FFP9) (AIP, Melville, New York, 2008, in press).
Organization of the Euclid Data Processing: Dealing with Complexity
The data processing development and operations for the Euclid mission (part of the ESA Cosmic Vision 2015-2025 Plan) are distributed within a Consortium composed of 14 countries and 1300+ persons: this imposes a high degree of complexity on the design and implementation of the data processing facilities. The focus of this paper is on the efforts to define an organisational structure capable of handling such complexity in manageable terms.
Open Transactions on Shared Memory
Transactional memory has arisen as a good way for solving many of the issues
of lock-based programming. However, most implementations admit only isolated
transactions, which are not adequate when we have to coordinate communicating
processes. To this end, in this paper we present OCTM, a Haskell-like language
with open transactions over shared transactional memory: processes can join
transactions at runtime just by accessing shared variables. Thus a transaction
can co-operate with its environment through shared variables, but if it is
rolled back, all its effects on the environment are also retracted. To prove
the expressive power of OCTM we give an implementation of TCCS, a CCS-like
calculus with open transactions.